
    Developing a Coherent Cyberinfrastructure from Local Campus to National Facilities: Challenges and Strategies

    A fundamental goal of cyberinfrastructure (CI) is the integration of computing hardware, software, and network technology, along with data, information management, and human resources, to advance scholarship and research. Such integration creates opportunities for researchers, educators, and learners to share ideas, expertise, tools, and facilities in new and powerful ways that cannot be realized if each of these components is applied independently. Bridging the gap between the reality of CI today and its potential in the immediate future is critical to building a balanced CI ecosystem that can support future scholarship and research. This report summarizes the observations and recommendations from a workshop held in July 2008, sponsored by the EDUCAUSE Net@EDU Campus Cyberinfrastructure Working Group (CCI) and the Coalition for Academic Scientific Computation (CASC). The invitational workshop was hosted at the University Place Conference Center on the IUPUI campus in Indianapolis. More than 50 individuals attended, representing a cross-section of faculty, senior campus information technology leaders, national lab directors, and other CI experts. The workshop focused on the challenges that must be addressed to build a coherent CI from the local to the national level, and on the potential opportunities that would result. Both the organizing committee and the workshop participants hope that some of the ideas, suggestions, and recommendations in this report will take hold and be implemented in the community. The goal is to create a better, more supportive, more usable CI environment that advances both scholarship and research.

    Image enhancement by nonlinear wavelet processing

    Conference Paper
    In this paper we describe how the theory of wavelet thresholding introduced by Donoho and Johnstone can be applied successfully to two distinct problems in image processing where traditional linear filtering techniques are insufficient. The first application concerns speckle reduction in coherent imaging systems. We show that the proposed method works well for reducing speckle in SAR images while maintaining bright reflections for subsequent processing and detection. Second, we apply the wavelet-based method to reduce the blocking artifacts associated with most DCT-based image coders (most notably the Joint Photographic Experts Group (JPEG) standard at high compression ratios). In particular, we demonstrate an algorithm for post-processing decoded images without the need for a novel coder/decoder. By applying this algorithm we obtain perceptually superior images at high compression ratios using the JPEG coding standard. For both applications we have developed methods for estimating the required threshold parameter, and we have applied them to a large number of images to study the effect of wavelet thresholding. Our main goal in this paper is to illustrate how the recent theory of wavelet denoising can be applied to a wide range of practical problems that do not necessarily satisfy all the assumptions of the developed theory.
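The thresholding step at the heart of this approach can be sketched in a few lines. Below is a minimal pure-NumPy illustration of Donoho–Johnstone soft thresholding with the universal threshold, using a single-level Haar transform and a synthetic blocky signal; the transform, signal, and noise level are illustrative choices, not the paper's actual SAR or JPEG setup.

```python
import numpy as np

def haar_fwd(x):
    """One level of the orthonormal Haar transform (x must have even length)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail (high-pass)
    return a, d

def haar_inv(a, d):
    """Inverse of haar_fwd."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft(x, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def denoise(y, sigma):
    """Threshold the detail band with the universal threshold sigma*sqrt(2 log n)."""
    a, d = haar_fwd(y)
    t = sigma * np.sqrt(2.0 * np.log(y.size))
    return haar_inv(a, soft(d, t))

rng = np.random.default_rng(0)
n, sigma = 1024, 0.5
clean = np.sign(np.sin(2 * np.pi * np.arange(n) / 256))  # blocky test signal
noisy = clean + sigma * rng.standard_normal(n)
rec = denoise(noisy, sigma)
print(np.mean((rec - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

In practice sigma is estimated from the data (e.g., from the median absolute deviation of the finest detail coefficients) rather than assumed known, which is part of what the paper's threshold-estimation methods address.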

    Moments, smoothness and optimization of wavelet systems

    This dissertation develops several new results on wavelet design using both traditional and new optimization criteria. We consider three design problems: two address explicit optimization of function properties, and the third addresses optimization of filter properties. The first design generalizes the maximally regular Daubechies wavelets by trading the maximal number of zero wavelet moments for a larger number of small wavelet moments. Most work on applying wavelets has shown that the Daubechies solution is remarkably robust and near optimal (among known wavelet systems) for a number of applications, yet one is frequently left wondering why. By introducing this new class of wavelets we feel that such questions can be properly addressed experimentally. The second design addresses smoothness. It has long been believed that wavelet smoothness is important in a number of applications; however, this belief rests primarily on experimental evidence, and it is not clear which kind of smoothness matters most (differentiability, Fourier transform decay rate, finite-scale behavior). To address this we develop a smoothness measure for designing wavelet bases that gives rise to smooth but non-differentiable wavelet systems. The new family of wavelets introduced here is based on optimization of finite-scale smoothness, achieved through a measure called discrete finite variation. Using the new measure, several design examples of finite-scale smooth wavelets are provided. Preliminary compression results indicate that these wavelet systems perform at least as well as both the optimally smooth solutions and the Daubechies solution. The third and final design considers least-squares optimal wavelet filters. We develop an algorithm based on Lagrange multiplier theory for the design of optimal filters. The algorithm is both flexible and robust: it can design wavelets under mixed constraints, and it provides solutions for long wavelet filters that were previously not possible. A common thread among the design algorithms discussed in this dissertation is the use of filter coefficients as the parameter space for optimization.
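To make the notion of wavelet moments concrete: for a wavelet filter g, the p-th discrete moment is the sum over k of k^p g[k], and the maximally regular Daubechies D4 filter has its first two wavelet moments equal to zero. The check below uses the standard D4 coefficients and the textbook alternating-flip construction; it is an illustration of the definition, not code from the dissertation.

```python
import numpy as np

# Daubechies D4 low-pass filter coefficients (orthonormal; they sum to sqrt(2))
s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))

# High-pass (wavelet) filter via the alternating-flip construction
g = ((-1.0) ** np.arange(4)) * h[::-1]

k = np.arange(4)
moment0 = np.sum(g)        # 0th wavelet moment
moment1 = np.sum(k * g)    # 1st wavelet moment
print(abs(moment0) < 1e-12, abs(moment1) < 1e-12)  # → True True
```

The relaxed designs in the dissertation would instead leave some of these moments small but nonzero, spending the freed degrees of freedom on other criteria.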

    New class of wavelets for signal approximation

    Conference Paper
    This paper develops a new class of wavelets for which the classical Daubechies zero-moment property has been relaxed. The advantage of relaxing higher-order wavelet moment constraints is that, within the framework of compact support and perfect reconstruction (orthogonal and biorthogonal), one can obtain wavelet bases with new and interesting approximation properties. This paper investigates a new class of wavelets obtained by setting a few lower-order moments to zero and using the remaining degrees of freedom to minimize a larger number of higher-order moments. The resulting wavelets are shown to be robust for representing a large class of inputs. Robustness is achieved at the cost of exact representation of low-order polynomials, but with the advantage that higher-order polynomials can be represented with less error than with the maximally regular solution of the same support.
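The trade-off described above rests on a standard fact: a wavelet filter with p vanishing moments annihilates sampled polynomials of degree below p, which is what relaxing moments gives up. A small NumPy illustration with the Daubechies D4 filter (two vanishing moments) applied to a linear ramp; the filter and signal are illustrative, not from the paper.

```python
import numpy as np

# Daubechies D4 filters: low-pass h, high-pass g by alternating flip
s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))
g = ((-1.0) ** np.arange(4)) * h[::-1]   # two vanishing moments

n = np.arange(64)
linear = 0.7 * n + 2.0                   # samples of a degree-1 polynomial
detail = np.convolve(linear, g, mode='valid')  # high-pass output, no edge effects
print(np.max(np.abs(detail)) < 1e-10)    # → True: the ramp is annihilated
```

With a relaxed-moment design, this output would be small but not exactly zero, while higher-degree polynomial inputs would incur less error than under the maximally regular filter.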

    Discrete finite variation: A new measure of smoothness for the design of wavelet basis

    Conference Paper
    A new method is given for measuring and designing smooth wavelet bases that dispenses with the need for a large number of zero wavelet moments. The method is based on minimizing the "discrete finite variation," a measure of the local "roughness" of a sampled version of the scaling function, and gives rise to "visually smooth" wavelet bases. Smooth wavelet bases are deemed important for several applications, in particular image compression, where the goal is to limit spurious artifacts due to non-smooth basis functions in the presence of quantization of the individual subbands. The definition of smoothness introduced here gives rise to new algorithms for designing smooth wavelet bases with only one vanishing moment, leaving free for optimization the parameters otherwise used to set moments to zero.
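One way to see what such a measure operates on: samples of the scaling function can be generated with the cascade algorithm, and a roughness score computed from their differences. The sketch below uses plain total variation of the samples as a simple stand-in for the paper's discrete finite variation measure; the stand-in measure and the Haar/D4 filter choices are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def cascade(h, levels=6):
    """Approximate samples of the scaling function by iterating
    upsample-and-filter on a delta (the cascade algorithm)."""
    phi = np.array([1.0])
    for _ in range(levels):
        up = np.zeros(2 * phi.size)
        up[0::2] = phi
        phi = np.sqrt(2.0) * np.convolve(up, h)
    return phi

def roughness(phi):
    """Total variation of the samples: a simple roughness proxy,
    standing in for the paper's discrete finite variation."""
    return np.sum(np.abs(np.diff(phi)))

s3 = np.sqrt(3.0)
d4 = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))
haar = np.array([1.0, 1.0]) / np.sqrt(2.0)
print(roughness(cascade(haar)), roughness(cascade(d4)))
```

A design procedure in this spirit would search over the free filter coefficients to minimize such a sampled-roughness score, rather than imposing additional vanishing moments.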

    Smooth biorthogonal wavelets for applications in image compression

    Conference Paper
    In this paper we introduce a new family of smooth, symmetric biorthogonal wavelet bases. The new wavelets are a generalization of the Cohen, Daubechies, and Feauveau (CDF) biorthogonal wavelet systems. Smoothness is controlled independently in the analysis and synthesis banks and is achieved by optimization of the discrete finite variation (DFV) measure recently introduced for orthogonal wavelet design. The DFV measure dispenses with measures of differentiability (for smoothness), which require a large number of vanishing wavelet moments (e.g., Hölder and Sobolev exponents), in favor of a smoothness measure that exploits the fact that only a finite depth of the filter bank tree is involved in most practical applications. Image compression examples applying the new filters with the embedded zerotree wavelet (EZW) compression algorithm due to Shapiro show that the new basis functions perform better than the classical CDF 7/9 wavelet basis.
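For readers unfamiliar with biorthogonal pairs, the defining condition relating the analysis low-pass filter h and the synthesis low-pass filter h~ is that the inner product of h[n] with h~[n - 2k] equals 1 for k = 0 and 0 otherwise. The check below verifies this for the short CDF 5/3 (LeGall) pair rather than the CDF 7/9 pair from the abstract, purely to keep the numbers compact.

```python
import numpy as np

# CDF 5/3 (LeGall) biorthogonal low-pass pair, both centered at index 0
h  = np.array([-1, 2, 6, 2, -1]) / 8.0   # analysis low-pass, indices -2..2
ht = np.array([1, 2, 1]) / 2.0           # synthesis low-pass, indices -1..1

def inner(k):
    """Sum of h[n] * ht[n - 2k] over the overlap of the index ranges."""
    total = 0.0
    for n in range(-2, 3):
        m = n - 2 * k
        if -1 <= m <= 1:
            total += h[n + 2] * ht[m + 1]
    return total

print([inner(k) for k in (-1, 0, 1)])  # → [0.0, 1.0, 0.0]
```

The generalization in the paper keeps this biorthogonality constraint while optimizing the remaining design freedom for DFV smoothness, independently in each bank.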

    Time Frequency Analysis Applications in Geophysics

    Book Chapter
    In this chapter we give an overview of a number of applications of time-frequency representations in seismic data processing, from the analysis of seismic sequences to efficient attribute extraction and 3-D attributes for volumetric data.
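The simplest time-frequency representation of the kind surveyed here is the short-time Fourier transform, from which attributes such as dominant frequency can be read off per time frame. A minimal pure-NumPy spectrogram of a synthetic up-chirp; the window length, hop size, and signal are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

def stft_mag(x, win_len=64, hop=16):
    """Magnitude STFT with a Hann window: one spectrum per time frame."""
    win = np.hanning(win_len)
    frames = [x[i:i + win_len] * win
              for i in range(0, x.size - win_len + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

fs = 1000.0
t = np.arange(2048) / fs
chirp = np.sin(2 * np.pi * (50 * t + 100 * t ** 2))  # frequency sweeps upward

S = stft_mag(chirp)                # shape: (frames, win_len // 2 + 1)
peaks = S.argmax(axis=1)           # dominant frequency bin per frame
print(peaks[0] < peaks[-1])        # up-chirp: peak bin rises over time
```

A per-frame attribute like this (the track of the dominant bin) is a toy version of the kind of time-frequency attribute extracted from seismic traces.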

    Design of Linear Phase Cosine Modulated Filter Banks for Subband Image Compression

    Tech Report
    Wavelet methods give a flexible alternative to Fourier methods in non-stationary signal analysis. The concept of band-limitedness plays a fundamental role in Fourier analysis. Since wavelet theory replaces frequency with scale, a natural question is whether there exists a useful concept of scale-limitedness. Obvious definitions of scale-limitedness are too restrictive, in that there would be few or no useful scale-limited signals. This paper introduces a viable definition of scale-limited signals and shows that the class is rich enough to include band-limited signals and impulse trains, among others. Moreover, for a wide choice of criteria, we show how to design the optimal wavelet for representing a given signal, and how to design robust wavelets that optimally represent certain classes of signals.
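One concrete way to picture scale-limitedness, using the Haar basis as an illustrative choice rather than the paper's actual definition: a signal that is piecewise constant on blocks of 2^J samples has exactly zero Haar detail coefficients at the J finest scales, so it lives entirely in the coarser scales.

```python
import numpy as np

def haar_dwt(x, levels):
    """Multi-level orthonormal Haar DWT: returns (approximation, details list)."""
    details = []
    a = x.astype(float)
    for _ in range(levels):
        details.append((a[0::2] - a[1::2]) / np.sqrt(2.0))  # detail at this scale
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)              # coarser approximation
    return a, details

rng = np.random.default_rng(1)
blocks = rng.standard_normal(32)
x = np.repeat(blocks, 8)           # piecewise constant on blocks of 8 = 2^3 samples

a, details = haar_dwt(x, levels=3)
fine_energy = sum(float(np.sum(d ** 2)) for d in details)
print(fine_energy == 0.0)          # → True: the 3 finest scales carry nothing
```

In this toy sense the signal is "limited" to scales coarser than 2^3; the report's contribution is a definition broad enough that band-limited signals and impulse trains also qualify.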